Maximum A Posteriori Maximum Entropy Order Determination (IEEE Transactions on Signal Processing)
Author
Abstract
An instance crucial to most problems in signal processing is the selection of the order of a presupposed model. Examples are the determination of the putative number of signals present in white Gaussian noise or the number of noise-contaminated sources impinging on a passive sensor array. It is shown that maximum a posteriori Bayesian arguments, coupled with maximum entropy considerations, offer an operational and consistent model order selection scheme, competitive with the minimum description length criterion.
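The minimum description length (MDL) criterion against which the paper benchmarks its scheme can be illustrated in its classic eigenvalue-based form for detecting the number of sources in white Gaussian noise. The sketch below implements the standard Wax-Kailath MDL rule, not the paper's MAP/maximum-entropy method; the array size, snapshot count, and noise level are illustrative assumptions.

```python
import numpy as np

def mdl_num_sources(eigvals, n_snapshots):
    """Minimum description length (MDL) estimate of the number of
    sources, from the eigenvalues of a p x p sample covariance matrix
    built from n_snapshots observations (Wax-Kailath form)."""
    lam = np.sort(np.asarray(eigvals, float))[::-1]  # descending
    p, N = lam.size, n_snapshots
    scores = []
    for k in range(p):
        tail = lam[k:]                       # presumed noise eigenvalues
        log_gm = np.mean(np.log(tail))       # log geometric mean
        log_am = np.log(np.mean(tail))       # log arithmetic mean
        fit = -N * (p - k) * (log_gm - log_am)   # sphericity of the tail
        penalty = 0.5 * k * (2 * p - k) * np.log(N)
        scores.append(fit + penalty)
    return int(np.argmin(scores))

# Two sources in white Gaussian noise at a 6-element array (illustrative)
rng = np.random.default_rng(0)
p, N = 6, 500
A = rng.standard_normal((p, 2))              # mixing matrix (2 sources)
X = A @ rng.standard_normal((2, N)) + 0.1 * rng.standard_normal((p, N))
R = X @ X.T / N                              # sample covariance
print(mdl_num_sources(np.linalg.eigvalsh(R), N))  # -> 2
```

The fit term is zero when the trailing eigenvalues are equal (as they would be for pure white noise) and grows with their spread, while the penalty term charges for the k(2p - k) free parameters of a rank-k signal model.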
Similar Works
Thresholding using two-dimensional histogram and fuzzy entropy principle
This paper presents a thresholding approach by performing fuzzy partition on a two-dimensional (2-D) histogram based on fuzzy relation and the maximum fuzzy entropy principle. The experiments with various gray-level and color images have demonstrated that the proposed approach outperforms the 2-D nonfuzzy approach and the one-dimensional (1-D) fuzzy partition approach.
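The maximum-entropy idea behind that 2-D fuzzy partition can be shown in its simplest 1-D, non-fuzzy form: pick the threshold that maximizes the combined Shannon entropy of the two sub-histograms it induces (a Kapur-style sketch under that assumption, not the authors' 2-D fuzzy method).

```python
import numpy as np

def max_entropy_threshold(hist):
    """Kapur-style maximum-entropy threshold on a 1-D gray-level
    histogram: choose t maximizing the sum of the Shannon entropies
    of the normalized sub-histograms below and above t."""
    p = np.asarray(hist, float)
    p = p / p.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, len(p)):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0.0 or w1 == 0.0:
            continue  # one side empty: entropy undefined
        q0, q1 = p[:t] / w0, p[t:] / w1   # conditional distributions
        h = -np.sum(q0[q0 > 0] * np.log(q0[q0 > 0])) \
            - np.sum(q1[q1 > 0] * np.log(q1[q1 > 0]))
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# Toy bimodal histogram: two uniform modes separated by an empty valley
hist = np.array([1, 1, 1, 1, 0, 0, 0, 0, 1, 1, 1, 1], float)
print(max_entropy_threshold(hist))  # -> 4 (start of the empty valley)
```

Any threshold inside the empty valley attains the same maximal entropy; the strict comparison keeps the first such t.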
Image Bit-Depth Enhancement via Maximum A Posteriori Estimation of AC Signal
When images at low bit-depth are rendered on high bit-depth displays, the missing least significant bits need to be estimated. We study the image bit-depth enhancement problem: estimating an original image from its quantized version from a minimum mean squared error (MMSE) perspective. We first argue that a graph-signal smoothness prior, one defined on a graph embedding the image structure, is an app...
Maximum Independence and Mutual Information (IEEE Transactions on Information Theory)
If I1, I2, ..., Ik are random Boolean variables and the joint probabilities up to the (k-1)-st order are known, the values of the k-th order probabilities maximizing the overall entropy have been defined as the maximum independence estimate. In the paper, some contributions deriving from the definition of maximum independence probabilities are proposed. First, it is shown that the maximum independence values are ...
Mean and variance of implicitly defined biased estimators (such as penalized maximum likelihood): applications to tomography
Many estimators in signal processing problems are defined implicitly as the maximum of some objective function. Examples of implicitly defined estimators include maximum likelihood, penalized likelihood, maximum a posteriori, and nonlinear least squares estimation. For such estimators, exact analytical expressions for the mean and variance are usually unavailable. Therefore, investigators usual...